Conversation
```ts
interface ModelMeta<TProviderOptions = unknown> {
  name: string
  id: string
  supports: {
```
I think this array should reflect the table found here:
https://docs.aws.amazon.com/bedrock/latest/userguide/conversation-inference-supported-models-features.html
@AlemTuzlak I've added this, but I'm curious whether this adapter should diverge from how the other adapters determine a model's modalities (the type map)?
I've left the type map in place for now.
Sorry, we didn't understand each other. Yes, what you added is what I wanted to see, but I meant in addition to input/output. For example, in some model-meta files I've done `supports.capabilities.[all of the capabilities]`, and then you can do `supports.input` as well; something like the sketch below.
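For illustration, a minimal sketch of the shape being suggested, extending the `ModelMeta` interface above. The specific capability names are illustrative (loosely taken from the Bedrock supported-features table), not the adapter's actual API:

```ts
interface ModelMeta<TProviderOptions = unknown> {
  name: string
  id: string
  supports: {
    // Modalities, mirroring the Bedrock supported-features table.
    input: ('text' | 'image' | 'document' | 'video')[]
    output: ('text' | 'image')[]
    // Capabilities, in addition to input/output modalities.
    capabilities: ('streaming' | 'systemPrompt' | 'toolUse' | 'guardrails')[]
  }
}
```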
```ts
const US_LLAMA_4_SCOUT = {
  name: 'Llama 4 Scout 17B Instruct (US)',
  id: 'us.meta.llama4-scout-17b-instruct-v1:0',
```
The id here seems tied to the region?
Yes, some of the newer models have inference profiles, including the Anthropic ones: https://docs.aws.amazon.com/bedrock/latest/userguide/inference-profiles-support.html
Maybe we should have something typed for regions, since it's consistent across all models that require it. I have not added any of the EU models, etc., yet.
To simplify, I've left only the ones where the inference profile is required for now.
The Bedrock SDK has been implemented for types only: the problem with using it at runtime is that it will not work on Cloudflare, so it is kept as a type-level dependency.
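To make the types-only usage concrete, here is a minimal sketch of what that can look like, assuming the `@aws-sdk/client-bedrock-runtime` package; the `converse` helper and its endpoint handling are hypothetical, not the adapter's actual implementation:

```ts
// Type-only imports are erased at compile time, so the AWS SDK never ends up
// in the runtime bundle and the Cloudflare incompatibility never applies.
import type {
  ConverseCommandInput,
  ConverseCommandOutput,
} from '@aws-sdk/client-bedrock-runtime'

// Hypothetical runtime call: a plain HTTP request against the Bedrock REST
// endpoint, typed with the SDK's request/response shapes.
// NOTE: SigV4 request signing is omitted for brevity; a real request must be
// signed or Bedrock will reject it.
async function converse(
  region: string,
  modelId: string,
  input: Omit<ConverseCommandInput, 'modelId'>,
): Promise<ConverseCommandOutput> {
  const res = await fetch(
    `https://bedrock-runtime.${region}.amazonaws.com/model/${encodeURIComponent(modelId)}/converse`,
    {
      method: 'POST',
      headers: { 'content-type': 'application/json' },
      body: JSON.stringify(input),
    },
  )
  if (!res.ok) throw new Error(`Bedrock request failed: ${res.status}`)
  return (await res.json()) as ConverseCommandOutput
}
```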
```ts
 * 2. `AWS_DEFAULT_REGION` environment variable
 * 3. `'us-east-1'` as final fallback
 */
region?: string
```
Can we extract the model region from here? For example, if it's eu/us/jp/sa and the model supports eu, we can tell the user at runtime that the model and the region don't match by throwing, or default to a regionless model if it's supported.
E.g.:
- model is Anthropic, supported regions are eu and us, user specifies ca: it throws
- model is Anthropic, supported regions are eu and us, user specifies us: we append it to the model id before sending the request
- model is Anthropic, regionless: we send the request with whatever region the user specifies
How's this? Added to the model config, and we check against it at runtime:
`inferenceProfile: { regions: ['us', 'eu', 'apac'], required: true },`
`required` is optional because some models require the inference profile and others do not.
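As a sketch of that runtime check, assuming the `inferenceProfile` shape above; the helper name, the prefix mapping, and the assumption that stored model ids are regionless are all illustrative, not the adapter's actual code:

```ts
interface InferenceProfile {
  regions: string[] // inference profile prefixes, e.g. 'us', 'eu', 'apac'
  required?: boolean // some models are only available through an inference profile
}

interface ModelConfig {
  id: string // assumed here to be the regionless model id
  inferenceProfile?: InferenceProfile
}

// Bedrock uses the 'apac.' profile prefix for ap-* regions.
const PREFIX_ALIASES: Record<string, string> = { ap: 'apac' }

// Hypothetical helper: resolve the model id to send for a given AWS region.
function resolveModelId(model: ModelConfig, region: string): string {
  const profile = model.inferenceProfile
  if (!profile) return model.id // regionless model: send the id as-is

  // Map a full region like 'eu-west-1' to its profile prefix 'eu'.
  const raw = region.split('-')[0]
  const prefix = PREFIX_ALIASES[raw] ?? raw

  // Supported region: prepend the prefix to form the inference profile id,
  // e.g. 'meta.llama4-scout-17b-instruct-v1:0' -> 'us.meta.llama4-scout-17b-instruct-v1:0'.
  if (profile.regions.includes(prefix)) return `${prefix}.${model.id}`

  if (profile.required) {
    throw new Error(
      `Model ${model.id} requires an inference profile in one of ` +
        `[${profile.regions.join(', ')}], but the configured region is ${region}.`,
    )
  }

  return model.id // profile supported but not required: fall back to the plain id
}
```

This covers the three cases from the review comment: an unsupported region with a required profile throws, a supported region gets the prefix appended to the model id, and a regionless model passes through unchanged.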
🎯 Changes
✅ Checklist
`pnpm run test:pr`.
🚀 Release Impact